Analysis and Synthesis of Behaviour Specific Facial Motion

Author

  • Lisa Nanette Gralewski
Abstract

This thesis presents work on the analysis of behaviour-specific facial motion and its automatic synthesis. Psychology research has shown that facial motion provides important cues to the human visual system for the recognition of emotion, identity and gender. Similarly, in computer vision, facial motion information has been used for face and facial expression recognition. However, the fact that facial motion is behaviour specific has not yet been exploited in facial animation systems. Two parametric modelling techniques are evaluated: multivariate autoregressive (VAR) temporal modelling and a tensor framework for modelling facial motion dynamics. Both techniques adopt a ‘black box’ approach to facial motion modelling, in which the emphasis is on motion information rather than textural information. Of the two, VAR modelling is found to be more suitable for motion synthesis, whereas the tensor framework is better suited as a tool for facial motion analysis.

The VAR modelling technique is found to encapsulate the temporal dynamics of facial motion behaviour. VAR models constructed from behaviour-specific facial motion generate facial motion sequences that are similar but not identical to the original training data (i.e. ‘synthesis by example’). These sequences are novel and can be indefinitely long. Moreover, analysis of the VAR models demonstrates that the models themselves are behaviour specific. VAR models constructed from gender-specific motion are found to have statistics similar to those of the original motion data, and in a human psychology experiment using these models, participants could distinguish between synthetic male and female facial motion. Emotion-specific VAR models were incorporated into a prototype animation tool that lets a user explore an interactive ‘emotion space’; by traversing this space, novel emotion-specific sequences are generated automatically ‘on the fly’.

The study also shows that the tensor framework can encapsulate facial motion information. Applied to facial motion in the speed domain (which encapsulates motion information only), the tensor framework can recognise emotion and gender with accuracy greater than chance. Furthermore, its gender recognition results are comparable to those of a human psychology experiment when both use the same facial motion data.

Together, the VAR modelling and tensor framework results corroborate psychology and computer vision research showing that motion alone is sufficient to encapsulate emotion-specific and gender-specific information. The results also indicate that emotion-specific information is encoded over a shorter temporal period than gender/identity-specific information. These findings have implications for the automatic synthesis of new facial motion sequences and for animation systems that wish to exploit motion to generate believable, realistic CG characters. In addition, the recognition of emotion and gender from motion/speed information alone could be relevant to security systems where textural information is poor (e.g. low-resolution imagery).
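To make the ‘synthesis by example’ idea concrete, the sketch below fits a multivariate autoregressive model to a behaviour-specific sequence of facial parameter vectors and samples a novel, arbitrarily long sequence from it. This is not the thesis implementation: the model order, the least-squares fit with Gaussian residual noise, and the assumption that each frame has already been reduced to a low-dimensional parameter vector (e.g. PCA coefficients of tracked facial points) are illustrative assumptions.

```python
# Minimal sketch of VAR-based 'synthesis by example' (illustrative, not the
# thesis code). X is a (T, d) array: one d-dimensional facial parameter
# vector per frame of a behaviour-specific training clip.
import numpy as np

def fit_var(X, p=2):
    """Least-squares fit of x_t = A_1 x_{t-1} + ... + A_p x_{t-p} + e_t.
    Returns the stacked coefficient matrix and the residual covariance."""
    T, d = X.shape
    # Regressors: the p previous frames, newest first, stacked per row.
    Z = np.hstack([X[p - i - 1:T - i - 1] for i in range(p)])  # (T-p, p*d)
    Y = X[p:]                                                   # (T-p, d)
    A, *_ = np.linalg.lstsq(Z, Y, rcond=None)                   # (p*d, d)
    resid = Y - Z @ A
    return A, np.cov(resid.T)

def synthesise(A, noise_cov, seed, n_frames, p=2, rng=None):
    """Generate a novel sequence of arbitrary length by feeding the model's
    predictions back in, perturbed by noise from the residual covariance."""
    rng = rng or np.random.default_rng()
    d = seed.shape[1]
    out = list(seed[-p:])
    for _ in range(n_frames):
        z = np.hstack(out[-1:-p - 1:-1])   # last p frames, newest first
        out.append(z @ A + rng.multivariate_normal(np.zeros(d), noise_cov))
    return np.array(out[p:])

# Usage (hypothetical data): train on one emotion-specific clip, then
# synthesise a new clip of any length that is similar but not identical.
# A, C = fit_var(X_happy, p=2)
# novel = synthesise(A, C, seed=X_happy[:2], n_frames=500, p=2)
```

Feeding the model’s own predictions back in, perturbed by noise drawn from the residual covariance, is what yields sequences that resemble the training motion without repeating it and that can be extended indefinitely.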


Similar Articles

Analysis and Synthesis of Facial Expressions by Feature-Points Tracking and Deformable Model

Facial expression recognition is useful for designing new interactive devices, offering the possibility of new ways for humans to interact with computer systems. In this paper we develop a facial expression analysis and synthesis system. The analysis part of the system is based on facial features extracted from facial feature points (FFP) in frontal image sequences. Selected facial feature poi...


Synthesis of human facial expressions based on the distribution of elastic force applied by control points

Facial expressions play an essential role in delivering emotions. Facial expression synthesis has therefore gained interest in many fields, such as computer vision and graphics. Facial actions are generated by contraction and relaxation of the muscles innervated by facial nerves. The possible combinations of these muscle motions are numerous; facial expressions are therefore often person specific. But in general, f...


CuMn2O4 nanostructures: Facile synthesis, structural, magnetic and electrical characterization and activation energy calculation

This work reports a stearic acid sol-gel synthesis method, together with the magnetic and electrical characterization and activation energy of copper manganese oxide nanostructures. The CuMn2O4 nanostructures are synthesized at a temperature of 600°C using the sol-gel method. Structural analysis using X-ray diffraction (XRD) and the Scherrer equation shows that the crystallite size of CuMn2O4 is ab...


A Nonlinear Grayscale Morphological and Unsupervised method for Human Facial Synthesis Based on an Example Image

Generating a human face from an example image is a requirement in biometric applications for identifying individuals. In this paper, face generation consists of three main steps. In the first step, significant lines and edges of the example image are detected using nonlinear grayscale morphology. Then, hair areas are identified from the sample face. The fin...


Modelling, Classification and Synthesis of Facial Expressions

The field of computer vision endeavours to develop automatic approaches to the interpretation of images from the real world. Over the past few decades, researchers within this field have created systems specifically for the automatic analysis of facial expression. The most successful of these approaches draw on tools from behavioural science. In this chapter we examine facial expressio...



Journal title:

Volume   Issue

Pages   –

Publication date: 2007